Faster Subgradient Methods for Functions with Hölderian Growth

Authors

  • Patrick R. Johnstone
  • Pierre Moulin
Abstract

The purpose of this manuscript is to derive new convergence results for several subgradient methods for minimizing nonsmooth convex functions with Hölderian growth. The growth condition is satisfied in many applications and includes functions with quadratic growth and functions with weakly sharp minima as special cases. To this end, we make four main contributions. First, for a constant and sufficiently small stepsize, we show that the subgradient method achieves linear convergence up to a certain region containing the optimal set, with error on the order of the stepsize. Second, we derive nonergodic convergence rates for the subgradient method under nonsummable decaying stepsizes. Third, if appropriate problem parameters are known, we derive a possibly-summable stepsize which obtains a much faster convergence rate. Finally, we develop a novel “descending stairs” stepsize which obtains this faster convergence rate and also achieves linear convergence in the special case of weakly sharp functions. We also develop a variant of the “descending stairs” stepsize which achieves essentially the same convergence rate without requiring an error-bound constant that is difficult to estimate in practice.
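
For orientation, Hölderian growth says the objective grows at least like a power of the distance to the solution set: f(x) − f* ≥ c·dist(x, X*)^p for some c > 0 and p ≥ 1, where p = 1 gives weakly sharp minima and p = 2 gives quadratic growth. The sketch below is an illustrative toy, not the paper's exact scheme; the test function, stepsizes, and iteration counts are all assumptions. It runs the subgradient method on f(x) = ‖x‖₁, a weakly sharp function, once with a constant stepsize and once with a nonsummable decaying stepsize:

    import numpy as np

    # Toy sketch: subgradient method on f(x) = ||x||_1, which has weakly
    # sharp minima (Holderian growth with exponent p = 1 at X* = {0}).
    def subgradient_method(x0, stepsize, iters=500):
        x = x0.copy()
        history = []
        for k in range(iters):
            g = np.sign(x)              # a subgradient of ||x||_1 at x
            x = x - stepsize(k) * g     # subgradient step with rule stepsize(k)
            history.append(np.linalg.norm(x, 1))
        return history

    x0 = np.array([3.0, -2.0, 1.5])

    # Constant stepsize: rapid decrease until the iterates enter a region
    # around the optimum whose radius is on the order of the stepsize.
    const_hist = subgradient_method(x0, stepsize=lambda k: 0.05)

    # Nonsummable decaying stepsize, e.g. O(1/sqrt(k)): keeps converging,
    # but at a slower rate.
    decay_hist = subgradient_method(x0, stepsize=lambda k: 0.5 / np.sqrt(k + 1))

    print(min(const_hist), min(decay_hist))

With the constant stepsize the objective stalls at roughly the stepsize level, matching the first result described above, while the decaying stepsize continues to shrink the error.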

Similar resources

Convergence Rates for Deterministic and Stochastic Subgradient Methods Without Lipschitz Continuity

We extend the classic convergence rate theory for subgradient methods to apply to non-Lipschitz functions. For the deterministic projected subgradient method, we present a global O(1/√T) convergence rate for any convex function which is locally Lipschitz around its minimizers. This approach is based on Shor’s classic subgradient analysis and implies generalizations of the standard convergenc...
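
A minimal sketch of that setup (the objective, feasible set, and horizon below are illustrative assumptions, not taken from the paper): the projected subgradient method with an O(1/√k) stepsize, tracking the best objective value seen, which is the quantity the O(1/√T) rate bounds.

    import numpy as np

    # Projected subgradient method with a 1/sqrt(k) stepsize on an
    # illustrative problem: minimize ||x - target||_1 over the box [-1, 1]^n.
    def projected_subgradient(x0, T=1000):
        x = x0.copy()
        target = np.array([2.0, -3.0, 0.5])  # unconstrained optimum lies outside the box
        best = np.inf
        for k in range(1, T + 1):
            g = np.sign(x - target)                      # subgradient of ||x - target||_1
            x = np.clip(x - g / np.sqrt(k), -1.0, 1.0)   # step, then project onto the box
            best = min(best, np.sum(np.abs(x - target)))
        return best                                      # best objective value seen

    print(projected_subgradient(np.zeros(3)))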

Stochastic Convex Optimization: Faster Local Growth Implies Faster Global Convergence

In this paper, a new theory is developed for first-order stochastic convex optimization, showing that the global convergence rate is sufficiently quantified by a local growth rate of the objective function in a neighborhood of the optimal solutions. In particular, if the objective function F(w) in the ε-sublevel set grows as fast as ‖w − w∗‖², where w∗ represents the closest optimal solution t...
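
Spelled out in standard form (the constants below are generic placeholders, not values from the paper), the local growth condition referenced above reads:

    F(w) - F_* \;\ge\; \lambda\, \|w - w_*\|^{\theta}
    \qquad \text{for all } w \text{ in the } \varepsilon\text{-sublevel set},

where θ = 2 recovers the quadratic-growth case quoted above; a smaller exponent θ means the function rises more steeply near w∗ and, per this paper's thesis, yields a faster global rate.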

Proximal point algorithms for nonsmooth convex optimization with fixed point constraints

The problem of minimizing the sum of nonsmooth, convex objective functions defined on a real Hilbert space over the intersection of fixed point sets of nonexpansive mappings, onto which the projections cannot be efficiently computed, is considered. The use of proximal point algorithms that use the proximity operators of the objective functions and incremental optimization techniques is proposed...
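
For context, proximity operators have simple closed forms for many common objectives; the classic example (illustrative, not specific to this paper) is the prox of the scaled ℓ1 norm, which reduces to coordinatewise soft-thresholding:

    import numpy as np

    # prox_{lam*||.||_1}(v) = argmin_x  lam*||x||_1 + 0.5*||x - v||^2,
    # which solves coordinatewise to the soft-thresholding operator below.
    def prox_l1(v, lam):
        return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

    # One proximal point step on f = ||.||_1 from v:
    print(prox_l1(np.array([1.5, -0.2, 0.7]), lam=0.5))  # -> [ 1.  -0.   0.2]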

A Bundle Method for Hydrothermal Scheduling

Lagrangian relaxation has been widely used in hydrothermal scheduling. Complicating constraints are relaxed by multipliers which are usually updated by a subgradient method (SGM). The SGM suffers from slow convergence caused by the non-differentiable characteristics of dual functions. This paper presents an algorithm that utilizes the Bundle Trust Region Method (BTRM) to update the multipliers ...

SOLVING FUZZY LINEAR PROGRAMMING PROBLEMS WITH LINEAR MEMBERSHIP FUNCTIONS-REVISITED

Recently, Gasimov and Yenilmez proposed an approach for solving two kinds of fuzzy linear programming (FLP) problems. Through the approach, each FLP problem is first defuzzified into an equivalent crisp problem which is non-linear and even non-convex. Then, the crisp problem is solved by the use of the modified subgradient method. In this paper we will have another look at the earlier defuzzifi...

Journal title:
  • CoRR

Volume: abs/1704.00196

Publication date: 2017